Diary 2023-09-30
There's still some material that hasn't made it to Scrapbox yet.
AI digs up questions from past lectures
We're starting to see a little bit of that.
Devising how to create chunks
Devising how to present to humans
Devising how to search
This didn't come with a link.
It was April 2018, so I didn't quite know how to use Scrapbox yet!
https://gyazo.com/918471c6dd11a44549d949d3cd98542b
That's what the omni vector search found.
This has a similar structure to Scrapbox's fuzzy link search, which fuzzy-matches against past related links when you try to write a link.
It addresses Scrapbox's problem of "it's hard to generate value unless you accumulate a lot of links."
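As a rough sketch of this "fuzzy-match past links while writing" idea — not Scrapbox's or omni's actual implementation, and the page titles below are made up — plain stdlib fuzzy matching shows the shape:

```python
import difflib

# Hypothetical page titles already accumulated in a Scrapbox project.
existing_links = [
    "Devising how to create chunks",
    "Cut out after the need is identified",
    "Augmented human",
    "Vector search over lecture transcripts",
]

def suggest_links(typed: str, n: int = 3) -> list[str]:
    """Fuzzy-match what has been typed so far against past link titles."""
    return difflib.get_close_matches(typed, existing_links, n=n, cutoff=0.3)

# While writing "[create chunks...", the editor could surface related past links.
print(suggest_links("create chunks"))
```

The same shape generalizes from string similarity to vector similarity over fragments.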
With LLMs, things written in natural language can now be executed not only by humans but also by computers.
Human + computer = augmented human
FORTRAN (1954)
A search hit lands on part of the transcript, and that part gets cut out.
An example of [cut out after the need is identified]
Delete the parts you don't want.
Whether to output search results to a page or not
It's not good for search results to go into the index.
Because it corrupts the mapping from titles to fragments.
Should search results be output?
The output has value because the results aren't just read once and thrown away.
There are times when people say, "This is interesting, even though it doesn't show up in the summary."
On the other hand, reading "cut-up fragments" is rather cognitively taxing.
I really wish I could summarize it better.
They should be output, but not be subject to search.
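One way to read "output, but not subject to search" — purely my assumption; the "#ai-output" marker and the page structure below are hypothetical, not how this project actually works — is to tag generated pages and skip them when building the index:

```python
# Hypothetical pages; AI-generated output carries a marker tag so the indexer skips it.
pages = [
    {"title": "Lecture 2018-04 notes", "body": "human-written content"},
    {"title": "Search results 2023-09-30", "body": "#ai-output\npasted fragments"},
]

def indexable(page: dict) -> bool:
    """Exclude AI-output pages so pasted fragments don't pollute the title-to-fragment mapping."""
    return "#ai-output" not in page["body"]

print([p["title"] for p in pages if indexable(p)])  # only the human-written page remains
```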
→ Devising how to create chunks
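A minimal sketch of one possible chunking scheme (fixed-size character windows with overlap; the sizes are arbitrary and this is not the approach the diary settles on):

```python
def make_chunks(text: str, size: int = 400, overlap: int = 100) -> list[str]:
    """Split a transcript into fixed-size windows with overlap,
    so a search hit can be cut out together with surrounding context."""
    if size <= overlap:
        raise ValueError("size must be larger than overlap")
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap
    return chunks

# Example: a long lecture transcript becomes overlapping fragments for vector search.
transcript = "x" * 1000  # placeholder for a real transcript
print(len(make_chunks(transcript)))  # -> 4
```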
---
This page is auto-translated from /nishio/日記2023-09-30 using DeepL. If you find something interesting but the auto-translated English is not good enough to understand it, feel free to let me know at @nishio_en. I'm very happy to spread my thoughts to non-Japanese readers.